Exploring Self-Distillation Based Relational Reasoning Training for Document-Level Relation Extraction
Authors
Abstract
Document-level relation extraction (RE) aims to extract relational triples from a document. One of its primary challenges is to predict implicit relations between entities, which are not explicitly expressed in the document but can usually be extracted through relational reasoning. Previous methods mainly model the reasoning process implicitly, via interactions among entities or entity pairs. However, they suffer from two deficiencies: 1) they often consider only one reasoning pattern, so their coverage of relational triples is limited; 2) they do not explicitly model the process of relational reasoning. In this paper, to deal with the first problem, we propose a document-level RE model with a reasoning module whose core unit is a reasoning multi-head self-attention unit. This unit is a variant of conventional multi-head self-attention and uses its four attention heads to model four common reasoning patterns, respectively, covering more relational triples than previous methods. Then, to address the second issue, we propose a self-distillation training framework that contains two branches sharing parameters. In the first branch, we randomly mask some entity-pair feature vectors in the document and then train our reasoning module to infer their relations by exploiting the feature information of other related entity pairs, thereby explicitly modeling relational reasoning. However, because this additional masking operation is not used during testing, it causes an input gap between training and testing scenarios, which would hurt model performance. To reduce this gap, we perform supervised training without masking in the second branch and utilize a Kullback-Leibler divergence loss to minimize the difference between the predictions of the two branches. Finally, we conduct comprehensive experiments on three benchmark datasets, and the experimental results demonstrate that our model consistently outperforms all competitive baselines. Our source code is available at https://github.com/DeepLearnXMU/DocRE-SD
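The abstract describes two mechanisms that a short sketch can make concrete. First, the reasoning multi-head self-attention unit: the authors' released code lives at the repository above, and the PyTorch sketch below is not that code, only a minimal illustration of the idea under stated assumptions. The name ReasoningMHSA, the tensor shapes, and the per-pattern mask interface are assumptions; in particular, how each of the four pattern masks is constructed from the document's entity pairs is left abstract here.

```python
import torch
import torch.nn as nn

class ReasoningMHSA(nn.Module):
    """Sketch of a reasoning multi-head self-attention unit: standard
    multi-head self-attention over entity-pair vectors, except that each
    of the four heads gets its own boolean mask restricting which entity
    pairs it may attend to (one mask per reasoning pattern)."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0
        self.h, self.d = num_heads, dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, pairs: torch.Tensor, pattern_masks: torch.Tensor):
        # pairs: (P, dim) entity-pair features of one document
        # pattern_masks: (4, P, P) bool, True where attention is allowed;
        # the sketch assumes every row keeps at least one allowed key.
        P, _ = pairs.shape
        q, k, v = self.qkv(pairs).chunk(3, dim=-1)             # each (P, dim)
        q, k, v = (t.view(P, self.h, self.d).transpose(0, 1) for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d ** 0.5       # (4, P, P)
        scores = scores.masked_fill(~pattern_masks, float("-inf"))
        ctx = (scores.softmax(dim=-1) @ v).transpose(0, 1).reshape(P, -1)
        return self.out(ctx)                                   # (P, dim)
```

Second, the two-branch self-distillation training step. The sketch below is again an assumed reconstruction from the abstract, not the released implementation: one branch masks a random subset of entity-pair vectors, the other runs an unmasked supervised pass with shared parameters, and a Kullback-Leibler divergence term ties the two branches' predictions together. All names (model, pair_feats, labels, mask_ratio) are illustrative; the masking value, the single-label cross-entropy, and the one-sided KL with a detached target are simplifying choices, not necessarily the authors'.

```python
import torch
import torch.nn.functional as F

def self_distillation_step(model, pair_feats, labels, mask_ratio=0.3):
    # pair_feats: (P, dim) entity-pair feature vectors of one document
    # labels:     (P,) gold relation labels
    # Branch 1: mask a random subset of entity-pair features, then ask the
    # reasoning module to infer their relations from the remaining pairs.
    masked = torch.rand(pair_feats.size(0)) < mask_ratio
    feats_masked = pair_feats.clone()
    feats_masked[masked] = 0.0                     # masking value is an assumption
    logits_masked = model(feats_masked)

    # Branch 2: plain supervised pass without masking (parameters are shared),
    # matching the input distribution seen at test time.
    logits_clean = model(pair_feats)

    # Supervised loss on both branches.
    ce = F.cross_entropy(logits_masked, labels) + F.cross_entropy(logits_clean, labels)

    # KL term pulls the masked branch's predictions toward the clean branch's,
    # shrinking the train/test input gap that masking introduces.
    kl = F.kl_div(F.log_softmax(logits_masked, dim=-1),
                  F.softmax(logits_clean.detach(), dim=-1),
                  reduction="batchmean")
    return ce + kl
```

Detaching the clean branch in the KL term makes it a fixed teacher for that step; a symmetric, two-sided formulation is an equally plausible reading of the abstract.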
Similar resources
Self-Crowdsourcing Training for Relation Extraction
One expensive step when defining crowdsourcing tasks is to define the examples and control questions for instructing the crowd workers. In this paper, we introduce a self-training strategy for crowdsourcing. The main idea is to use an automatic classifier, trained on weakly supervised data, to select examples associated with high confidence. These are used by our automatic agent to explain the ...
Exploiting information extraction annotations for document retrieval in distillation tasks
Information distillation aims to extract relevant pieces of information related to a given query from massive, possibly multilingual, audio and textual document sources. In this paper, we present our approach for using information extraction annotations to augment document retrieval for distillation. We take advantage of the fact that some of the distillation queries can be associated with anno...
Learning Relational Dependency Networks for Relation Extraction
We consider the task of KBP slot filling – extracting relation information from newswire documents for knowledge base construction. We present our pipeline, which employs Relational Dependency Networks (RDNs) to learn linguistic patterns for relation extraction. Additionally, we demonstrate how several components such as weak supervision, word2vec features, joint learning and the use of human a...
Adversarial Training for Relation Extraction
Adversarial training is a means of regularizing classification algorithms by adding adversarial noise to the training data. We apply adversarial training to relation extraction within the multi-instance multi-label learning framework. We evaluate various neural network architectures on two different datasets. Experimental results demonstrate that adversarial training is generally effective f...
Learning Relational Structure for Temporal Relation Extraction
Recently there has been a lot of interest in using Statistical Relational Learning (SRL) models for Information Extraction (IE). One of the important IE tasks is extraction of temporal relations between events and time expressions (timex). SRL methods that use hand-written rules have been proposed for various IE tasks. In contrast, we propose an approach that employs structure learning in SRL t...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i11.26635